Reliable detection of cognitive load would benefit the design of intelligent assistive navigation aids for the visually impaired (VIP). Ten participants with various degrees of sight loss navigated unfamiliar indoor and outdoor environments while their electroencephalogram (EEG) and electrodermal activity (EDA) signals were recorded. In this study, the cognitive load of the tasks was assessed in real time based on a modification of the well-established event-related (de)synchronization (ERD/ERS) index. We present an in-depth analysis of the environments that most challenge people with certain categories of sight loss, together with an automatic classification of the perceived difficulty at each time instant, inferred from their biosignals. Given the limited size of our sample, our findings suggest that there are significant differences across the environments for the various categories of sight loss. Moreover, we exploit cross-modal relations to predict cognitive load in real time from features extracted from the EDA signal. This possibility paves the way for the design of less invasive, wearable assistive devices that take the well-being of the VIP into consideration.
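For reference, the classical ERD/ERS index, of which this study uses a modification, expresses the relative change of EEG band power in a task window with respect to a resting baseline. The sketch below is a minimal illustration of that standard definition only; the sampling rate, frequency band, and windowing are illustrative assumptions, and the real-time modification applied in the study is not detailed in this abstract.

```python
import numpy as np
from scipy.signal import welch

def band_power(segment, fs, band):
    """Average power of a 1-D EEG segment within a frequency band, via Welch's method."""
    freqs, psd = welch(segment, fs=fs, nperseg=min(len(segment), int(2 * fs)))
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return np.trapz(psd[mask], freqs[mask])

def erd_ers_percent(task_segment, baseline_segment, fs=256.0, band=(8.0, 12.0)):
    """Classic ERD/ERS index: relative band-power change of a task window with
    respect to a reference (baseline) window, in percent. With this convention,
    negative values indicate desynchronization (ERD) and positive values
    synchronization (ERS). Band and fs are illustrative, not the study's settings."""
    a = band_power(task_segment, fs, band)       # power during the task window
    r = band_power(baseline_segment, fs, band)   # power during the baseline window
    return 100.0 * (a - r) / r
```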